AIbase

# Context-aware translation

## TowerInstruct-13B-v0.1-GGUF
**LoneStriker** · Large Language Model · Supports Multiple Languages
TowerInstruct-13B is a 13-billion-parameter language model fine-tuned from the TowerBase model on the TowerBlocks supervised fine-tuning dataset, designed specifically for a range of translation-related tasks.
Downloads: 57 · Likes: 9
## TowerInstruct-7B-v0.2
**Unbabel** · Large Language Model · Transformers · Supports Multiple Languages
TowerInstruct-7B-v0.2 is a 7-billion-parameter multilingual large language model focused on translation-related tasks, supporting 10 languages.
Downloads: 5,003 · Likes: 35
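The TowerInstruct models are instruction-tuned, so a translation is requested through a natural-language prompt rather than a dedicated translation API. A minimal sketch of assembling such a prompt (the exact wording is an assumption, not the model's documented template; actual inference would then go through, e.g., the `transformers` text-generation pipeline):

```python
# Hedged sketch: building a translation prompt for an instruction-tuned
# model such as Unbabel/TowerInstruct-7B-v0.2. The prompt wording is an
# assumption for illustration, not the model's documented chat template.

def build_translation_prompt(text: str, src: str, tgt: str) -> str:
    """Return a plain instruction prompt asking for a translation."""
    return (
        f"Translate the following text from {src} into {tgt}.\n"
        f"{src}: {text}\n"
        f"{tgt}:"
    )

prompt = build_translation_prompt("The weather is nice today.", "English", "French")
print(prompt)
```

The trailing `French:` line cues the model to emit only the translation as its continuation.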
## scat-marian-small-target-ctx4-cwd0-en-fr
**context-mt** · Apache-2.0 · Machine Translation · Transformers · Supports Multiple Languages
This model is based on Helsinki-NLP/opus-mt-en-fr, further trained for English-French translation in a context tag format on the IWSLT17 dataset, then fine-tuned on the SCAT+ training set.
Downloads: 39 · Likes: 1
## scat-marian-small-ctx4-cwd1-en-fr
**context-mt** · Apache-2.0 · Machine Translation · Transformers · Supports Multiple Languages
This model is based on Helsinki-NLP/opus-mt-en-fr, further trained on the IWSLT17 dataset in a context tag format for English-French translation, then fine-tuned on the SCAT+ dataset. It is suitable for both context-aware and context-free English-French translation.
Downloads: 42 · Likes: 0
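Both SCAT models consume preceding sentences as inline context ahead of the sentence being translated. A minimal sketch of assembling such an input (the `<brk>` separator and the 4-sentence window are assumptions inferred from the `ctx4` naming, not a documented format):

```python
# Hedged sketch: building a context-tagged source input for a
# context-aware Marian model such as
# context-mt/scat-marian-small-ctx4-cwd1-en-fr.
# The "<brk>" separator and the 4-sentence window are assumptions
# based on the model's "ctx4" naming, not a documented specification.

def build_context_input(context, current, window=4, sep="<brk>"):
    """Join up to `window` preceding sentences, then the separator,
    then the sentence to translate; skip the separator if no context."""
    ctx = " ".join(context[-window:])
    return f"{ctx} {sep} {current}" if ctx else current

src = build_context_input(
    ["She picked up the book.", "It was heavy."],
    "She put it down again.",
)
print(src)
# -> She picked up the book. It was heavy. <brk> She put it down again.
```

Passing an empty context list yields the bare sentence, which matches the card's note that the model also handles context-free translation.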
© 2025 AIbase